180 research outputs found
Relating toy models of quantum computation: comprehension, complementarity and dagger mix autonomous categories
Toy models have been used to separate important features of quantum
computation from the rich background of the standard Hilbert space model.
Category theory, on the other hand, is a general tool for separating the
components of mathematical structures and analyzing them one layer at a time.
It seems natural to
combine the two approaches, and several authors have already pursued this idea.
We explore *categorical comprehension construction* as a tool for adding
features to toy models. We use it to comprehend quantum propositions and
probabilities within the basic model of finite-dimensional Hilbert spaces. We
also analyze complementary quantum observables over the category of sets and
relations. This leads into the realm of *test spaces*, a well-studied model. We
present one of many possible extensions of this model, enabled by the
comprehension construction. Conspicuously, all models obtained in this way
carry the same categorical structure, *extending* the familiar dagger compact
framework with the complementation operations. We call the obtained structure
*dagger mix autonomous*, because it extends mix autonomous categories, popular
in computer science, just as dagger compact structure extends
compact categories. Dagger mix autonomous categories seem to arise quite
naturally in quantum computation, as soon as complementarity is viewed as a
part of the global structure.
Comment: 21 pages, 6 figures; Proceedings of Quantum Physics and Logic, Oxford 8-9 April 200
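To make the setting concrete, here is a minimal Python sketch (my illustration, not the paper's construction) of the category of sets and relations mentioned in the abstract: relations as sets of pairs, with composition, tensor, and a dagger given by relational converse, the basic dagger structure that the toy models share with the Hilbert space model.

```python
# Relations R : A -> B represented as frozensets of pairs (a, b).

def compose(R, S):
    """Diagrammatic composition: first R : A -> B, then S : B -> C."""
    return frozenset((a, c) for (a, b1) in R for (b2, c) in S if b1 == b2)

def dagger(R):
    """The dagger is the relational converse R-dagger : B -> A."""
    return frozenset((b, a) for (a, b) in R)

def tensor(R, S):
    """Tensor of R : A -> B and S : C -> D, a relation A x C -> B x D."""
    return frozenset(((a, c), (b, d)) for (a, b) in R for (c, d) in S)

# Dagger axioms: involutive, and contravariant over composition.
R = frozenset({(0, 'x'), (1, 'x'), (1, 'y')})
S = frozenset({('x', 0), ('y', 1)})
assert dagger(dagger(R)) == R
assert dagger(compose(R, S)) == compose(dagger(S), dagger(R))
```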
Gaming security by obscurity
Shannon sought security against the attacker with unlimited computational
powers: *if an information source conveys some information, then Shannon's
attacker will surely extract that information*. Diffie and Hellman refined
Shannon's attacker model by taking into account the fact that the real
attackers are computationally limited. This idea became one of the greatest new
paradigms in computer science, and led to modern cryptography.
Shannon also sought security against the attacker with unlimited logical and
observational powers, expressed through the maxim that "the enemy knows the
system". This view is still endorsed in cryptography. The popular formulation,
going back to Kerckhoffs, is that "there is no security by obscurity", meaning
that the algorithms cannot be kept obscured from the attacker, and that
security should only rely upon the secret keys. In fact, modern cryptography
goes even further than Shannon or Kerckhoffs in tacitly assuming that *if there
is an algorithm that can break the system, then the attacker will surely find
that algorithm*. The attacker is not viewed as an omnipotent computer any more,
but he is still construed as an omnipotent programmer.
So the Diffie-Hellman step from unlimited to limited computational powers has
not been extended into a step from unlimited to limited logical or programming
powers. Is the assumption that all feasible algorithms will eventually be
discovered and implemented really different from the assumption that everything
that is computable will eventually be computed? The present paper explores some
ways to refine the current models of the attacker, and of the defender, by
taking into account their limited logical and programming powers. If the
adaptive attacker actively queries the system to seek out its vulnerabilities,
can the system gain some security by actively learning the attacker's methods,
and adapting to them?
Comment: 15 pages, 9 figures, 2 tables; final version appeared in the Proceedings of New Security Paradigms Workshop 2011 (ACM 2011); typos corrected
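As a toy illustration of that closing question (my sketch, not a model from the paper; the vectors and probabilities below are made up), consider a defender who reallocates its single unit of hardening to whichever attack vector it has seen probed most often: by learning the attacker's bias it ends up blocking the attacker's favorite vector, which no fixed, non-learning choice is guaranteed to do.

```python
import random

random.seed(0)
VECTORS = range(5)                          # hypothetical attack vectors
attacker_bias = [0.5, 0.2, 0.1, 0.1, 0.1]   # attacker favors vector 0
probes = [0] * len(VECTORS)
breaches = 0

for t in range(1, 5001):
    attack = random.choices(VECTORS, weights=attacker_bias)[0]
    probes[attack] += 1
    # Adaptive defender: harden the most frequently probed vector so far.
    defended = max(VECTORS, key=lambda v: probes[v])
    breaches += (attack != defended)
    if t in (100, 1000, 5000):
        print(f"after {t} probes: breach rate {breaches / t:.2f}")
```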
Dynamics, robustness and fragility of trust
Trust is often conveyed through delegation, or through recommendation. This
makes the trust authorities, who process and publish trust recommendations,
into an attractive target for attacks and spoofing. In some recent empirical
studies, this was shown to lead to a remarkable phenomenon of *adverse
selection*: a greater percentage of unreliable or malicious web merchants was
found among those with certain types of trust certificates than among those
without. While such findings can be attributed to a lack of diligence in trust
authorities, or even to conflicts of interest, our analysis of trust dynamics
suggests that public trust networks would probably remain vulnerable even if
trust authorities were perfectly diligent. The reason is that the process of
trust building, if trust is not breached too often, naturally leads to
power-law distributions: the rich get richer, the trusted attract more trust.
The evolutionary processes with such distributions, ubiquitous in nature, are
known to be robust with respect to random failures, but vulnerable to adaptive
attacks. We recommend some ways to decrease the vulnerability of trust
building, and suggest some ideas for exploration.
Comment: 17 pages; simplified the statement and the proof of the main theorem; FAST 200
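The rich-get-richer dynamics behind the power law can be illustrated by a standard preferential attachment simulation (a generic sketch, not the paper's model): each newcomer endorses an existing party with probability proportional to the trust that party already holds, and a handful of parties end up holding most of the trust.

```python
import random

random.seed(1)
trust = [1, 1]                     # two founding parties, one unit each

for _ in range(10_000):
    # Preferential attachment: the trusted attract more trust.
    i = random.choices(range(len(trust)), weights=trust)[0]
    trust[i] += 1
    trust.append(1)                # the newcomer starts with one unit

trust.sort(reverse=True)
total = sum(trust)
print("share held by top 10 parties:  ", sum(trust[:10]) / total)
print("share held by the bottom half: ", sum(trust[len(trust) // 2:]) / total)
```

Random failures mostly hit the many small parties, while a targeted attack on the top few removes a large share of the total trust mass, which is the robustness/fragility asymmetry described above.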
Network as a computer: ranking paths to find flows
We explore a simple mathematical model of network computation, based on
Markov chains. Similar models apply to a broad range of computational
phenomena arising in networks of computers, in genetic and neural nets, in
social networks, and so on. The main problem of interaction with such
spontaneously evolving computational systems is that the data are not uniformly
structured. An interesting approach is to try to extract the semantical content
of the data from their distribution among the nodes. A concept is then
identified by finding the community of nodes that share it. The task of data
structuring is thus reduced to the task of finding the network communities, as
groups of nodes that together perform some non-local data processing. Towards
this goal, we extend the ranking methods from nodes to paths. This allows us to
extract some information about the likely flow biases from the available static
information about the network.
Comment: 12 pages, CSR 200
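As a reference point for the ranking methods being extended, here is a minimal PageRank-style sketch (a generic illustration, not the paper's path-ranking construction): node ranks arise as the stationary distribution of a damped random walk, i.e. a Markov chain, on the network.

```python
import numpy as np

# Toy directed network as an adjacency list.
adj = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85                     # number of nodes, damping factor

P = np.zeros((n, n))               # random-walk transition matrix
for i, outs in adj.items():
    for j in outs:
        P[i, j] = 1 / len(outs)

rank = np.full(n, 1 / n)
for _ in range(100):               # power iteration to the fixed point
    rank = (1 - d) / n + d * rank @ P
print(rank)                        # node 2 collects the highest rank
```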
Geometry of abstraction in quantum computation
Quantum algorithms are sequences of abstract operations, performed on
non-existent computers. They are in obvious need of categorical semantics. We
present some steps in this direction, following earlier contributions of
Abramsky, Coecke and Selinger. In particular, we analyze function abstraction
in quantum computation, which turns out to characterize its classical
interfaces. Some quantum algorithms provide feasible solutions of important
hard problems, such as factoring and discrete log (which are the building
blocks of modern cryptography). It is of great practical interest to
precisely characterize the computational resources needed to execute such
quantum algorithms. There are many ideas about how to build a quantum computer.
Can
we prove some necessary conditions? Categorical semantics help with such
questions. We show how to implement an important family of quantum algorithms
using just abelian groups and relations.
Comment: 29 pages, 42 figures; Clifford Lectures 2008 (main speaker Samson Abramsky); this version fixes a pstricks problem in a diagram
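To hint at how abelian groups can carry such algorithms, here is a classical sketch (my illustration, not the paper's construction) of the group-theoretic skeleton of Simon's algorithm over Z_2^n: the quantum circuit only serves to sample group elements orthogonal to the hidden shift s, after which s is recovered by plain linear algebra in the group.

```python
import random

random.seed(2)
n = 6
s = random.randrange(1, 2 ** n)       # hidden shift: f(x) = f(x ^ s)

def dot(x, y):
    """Inner product on Z_2^n: parity of the bitwise AND."""
    return bin(x & y).count('1') % 2

def sample_orthogonal():
    """Stand-in for the quantum subroutine: Simon's circuit outputs a
    uniformly random y with dot(y, s) == 0."""
    while True:
        y = random.randrange(2 ** n)
        if dot(y, s) == 0:
            return y

# Collect samples; with overwhelming probability they span the
# orthogonal complement of s, so s is the unique nonzero solution.
# (A real solver would use Gaussian elimination over GF(2); brute
# force suffices for this toy size.)
ys = [sample_orthogonal() for _ in range(4 * n)]
candidates = [c for c in range(1, 2 ** n) if all(dot(y, c) == 0 for y in ys)]
print("candidates:", [f"{c:0{n}b}" for c in candidates])
print("hidden s:  ", f"{s:0{n}b}")
```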
Chasing diagrams in cryptography
Cryptography is a theory of secret functions. Category theory is a general
theory of functions. Cryptography has reached a stage where its structures
often take several pages to define, and its formulas sometimes run from page to
page. Category theory has some complicated definitions as well, but one of its
specialties is taming the flood of structure. Cryptography seems to be in need
of high level methods, whereas category theory always needs concrete
applications. So why is there no categorical cryptography? One reason may be
that the foundations of modern cryptography are built from probabilistic
polynomial-time Turing machines, and category theory does not have a good
handle on such things. On the other hand, such foundational problems might be
the very reason why cryptographic constructions often resemble low level
machine programming. I present some preliminary explorations towards
categorical cryptography. It turns out that some of the main security concepts
are easily characterized through the categorical technique of *diagram
chasing*, which was first used in Lambek's seminal `Lecture Notes on Rings and
Modules'.
Comment: 17 pages, 4 figures; to appear in: 'Categories in Logic, Language and Physics. Festschrift on the occasion of Jim Lambek's 90th birthday', Claudia Casadio, Bob Coecke, Michael Moortgat, and Philip Scott (editors); this version: fixed typos found by kind readers
Monoidal computer III: A coalgebraic view of computability and complexity
Monoidal computer is a categorical model of intensional computation, where
many different programs correspond to the same input-output behavior. The
upshot of yet another model of computation is that a categorical formalism
should provide a much-needed high-level language for the theory of computation,
flexible enough to allow abstracting away the low level implementation details
when they are irrelevant, or taking them into account when they are genuinely
needed. A salient feature of the approach through monoidal categories is the
formal graphical language of string diagrams, which supports visual reasoning
about programs and computations.
In the present paper, we provide a coalgebraic characterization of monoidal
computer. It turns out that the availability of interpreters and specializers,
that make a monoidal category into a monoidal computer, is equivalent to the
existence of a *universal state space* that carries a weakly final state
machine for any pair of input and output types. Being able to program state
machines in monoidal computers allows us to represent Turing machines, to
capture their execution, to count their steps, and to count, e.g., the memory
cells that they use. The coalgebraic view of monoidal computer thus provides a
convenient diagrammatic language for studying computability and complexity.
Comment: 34 pages, 24 figures; in this version: added the Appendix
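The interpreter-specializer interplay that defines a monoidal computer has a familiar down-to-earth instance in any programming language. The toy sketch below (a hedged illustration using Python source strings as the type of programs, not the categorical formalism itself) exhibits a universal interpreter run and a specializer satisfying the defining equation run(specialize(p, x), y) == run(p, x, y), as in the s-m-n theorem.

```python
# Programs are Python source strings defining a function f.
# run() is a universal interpreter; specialize() bakes the first
# argument into a program.

def run(program, *args):
    env = {}
    exec(program, env)                  # interpret: execute the source
    return env['f'](*args)

def specialize(program, x):
    """Return a one-argument program with x fixed as first argument."""
    return (program.replace('def f(', 'def g(')
            + f"\ndef f(y):\n    return g({x!r}, y)\n")

p = "def f(x, y):\n    return x * 10 + y\n"
assert run(specialize(p, 3), 4) == run(p, 3, 4) == 34
```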
Quantifying pervasive authentication: the case of the Hancke-Kuhn protocol
As mobile devices pervade physical space, the familiar authentication
patterns are becoming insufficient: besides entity authentication, many
applications require, e.g., location authentication. Many interesting protocols
have been proposed and implemented to provide such strengthened forms of
authentication, but there are very few proofs that such protocols satisfy the
required security properties. The logical formalisms, devised for reasoning
about security protocols on standard computer networks, turn out to be
difficult to adapt for reasoning about hybrid protocols, used in pervasive and
heterogeneous networks.
We refine the Dolev-Yao-style algebraic method for protocol analysis by a
probabilistic model of guessing, needed to analyze protocols that mix weak
cryptography with physical properties of nonstandard communication channels.
Applying this model, we provide a precise security proof for a proximity
authentication protocol, due to Hancke and Kuhn, that uses a subtle form of
probabilistic reasoning to achieve its goals.
Comment: 31 pages, 2 figures; short version of this paper appeared in the Proceedings of MFPS 201
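For a feel of the probabilistic model of guessing, the Monte Carlo sketch below (my illustration, not the paper's proof) estimates a distant attacker's success against the rapid bit exchange of the Hancke-Kuhn protocol: the prover keeps two response registers, an attacker who queries the prover ahead of time learns one register bit per round, and so answers each of the n challenges correctly with probability 3/4, for an overall success rate near (3/4)^n.

```python
import random

random.seed(3)
n, trials = 16, 100_000
wins = 0

for _ in range(trials):
    # Two n-bit response registers, derived from the shared secret.
    R = [[random.getrandbits(1) for _ in range(n)] for _ in range(2)]
    # The distant attacker queries the prover early with guessed
    # challenges, learning one of the two register bits per round.
    guess = [random.getrandbits(1) for _ in range(n)]
    ok = True
    for i in range(n):
        c = random.getrandbits(1)          # verifier's actual challenge
        if c == guess[i]:
            resp = R[c][i]                 # learned in advance
        else:
            resp = random.getrandbits(1)   # must guess the other register
        if resp != R[c][i]:
            ok = False
            break
    wins += ok

print(f"attacker success rate: {wins / trials:.5f}")
print(f"(3/4)^{n} = {(3 / 4) ** n:.5f}")
```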
Towards concept analysis in categories: limit inferior as algebra, limit superior as coalgebra
While computer programs and logical theories begin by declaring the concepts
of interest, be it as data types or as predicates, network computation does not
allow such global declarations, and requires *concept mining* and *concept
analysis* to extract shared semantics for different network nodes. Powerful
semantic analysis systems have been the drivers of nearly all paradigm shifts
on the web. In categorical terms, most of them can be described as
bicompletions of enriched matrices, generalizing the Dedekind-MacNeille-style
completions from posets to suitably enriched categories. Yet it has been well
known for more than 40 years that ordinary categories themselves in general do
not permit such completions. Armed with this new semantical view of
Dedekind-MacNeille completions, and of matrix bicompletions, we take another
look at this ancient mystery. It turns out that simple categorical versions of
the *limit superior* and *limit inferior* operations characterize a general
notion of Dedekind-MacNeille completion that seems to be appropriate for
ordinary categories, and boils down to the more familiar enriched versions when
the limits inferior and superior coincide. This explains away the apparent gap
among the completions of ordinary categories, and broadens the path towards
categorical concept mining and analysis, opened in previous work.
Comment: 22 pages, 5 figures and 9 diagrams
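In the poset case that this generalizes, the Dedekind-MacNeille completion is computed directly from the Galois connection between lower and upper bounds. The sketch below (the standard order-theoretic construction, not the paper's categorical version) completes a small poset by enumerating its cuts, the subsets closed under taking lower bounds of upper bounds.

```python
from itertools import combinations

# A small poset: a, b both below c, d (order given reflexively).
elems = {'a', 'b', 'c', 'd'}
leq = {(x, x) for x in elems} | {('a', 'c'), ('a', 'd'), ('b', 'c'), ('b', 'd')}

def ub(S):
    """Upper bounds of S in the poset."""
    return {x for x in elems if all((s, x) in leq for s in S)}

def lb(S):
    """Lower bounds of S in the poset."""
    return {x for x in elems if all((x, s) in leq for s in S)}

# Cuts are the subsets fixed by the closure lb . ub; ordered by
# inclusion, they form the Dedekind-MacNeille completion.
subsets = [set(c) for r in range(len(elems) + 1)
           for c in combinations(sorted(elems), r)]
for S in subsets:
    if lb(ub(S)) == S:
        print(sorted(S))
# The output includes a new bottom ([]), a new top, and the new cut
# ['a', 'b'], adjoined as the missing join of a and b (equivalently
# the missing meet of c and d).
```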
- ā¦